Generalization performance of regularization networks and support vector machines via entropy numbers of compact operators
Authors
Abstract
We derive new bounds for the generalization error of kernel machines, such as support vector machines and related regularization networks, by obtaining new bounds on their covering numbers. The proofs make use of a viewpoint that is apparently novel in the field of statistical learning theory. The hypothesis class is described in terms of a linear operator mapping from a possibly infinite-dimensional unit ball in feature space into a finite-dimensional space. The covering numbers of the class are then determined via the entropy numbers of the operator. These numbers, which characterize the degree of compactness of the operator, can be bounded in terms of the eigenvalues of an integral operator induced by the kernel function used by the machine. As a consequence, we are able to theoretically explain the effect of the choice of kernel function on the generalization performance of support vector machines.
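For orientation, a minimal sketch of the standard definitions behind this statement (standard notation assumed, not quoted from the paper): for a bounded linear operator $T : E \to F$ between Banach spaces with unit balls $U_E$ and $U_F$, the $n$-th entropy number is
$$\epsilon_n(T) \;=\; \inf\Bigl\{\epsilon > 0 \;:\; \exists\, f_1,\dots,f_n \in F \text{ with } T(U_E) \subseteq \bigcup_{i=1}^{n} \bigl(f_i + \epsilon\, U_F\bigr)\Bigr\},$$
so entropy numbers are essentially the functional inverse of covering numbers: $\mathcal{N}\bigl(\epsilon, T(U_E)\bigr) \le n$ whenever $\epsilon_n(T) \le \epsilon$. The eigenvalues referred to in the abstract are those of the integral operator induced by the kernel $k$ on the input space $\mathcal{X}$,
$$(T_k f)(x) \;=\; \int_{\mathcal{X}} k(x, y)\, f(y)\, d\mu(y), \qquad T_k \phi_j = \lambda_j \phi_j, \quad \lambda_1 \ge \lambda_2 \ge \cdots \ge 0.$$
Roughly speaking, faster decay of the $\lambda_j$ yields smaller entropy numbers and hence tighter covering-number and generalization bounds, which is how the choice of kernel enters the analysis.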
Similar resources
Generalization Performance of Regularization Networks and Support Vector Machines via Entropy Numbers of Compact Operators (Produced as Part of the ESPRIT Working Group in Neural and Computational Learning II, NeuroCOLT2 27150)
We derive new bounds for the generalization error of kernel machines, such as support vector machines and related regularization networks by obtaining new bounds on their covering numbers. The proofs make use of a viewpoint that is apparently novel in the field of statistical learning theory. The hypothesis class is described in terms of a linear operator mapping from a possibly infinite dimension...
Entropy Numbers, Operators and Support Vector Kernels
We derive new bounds for the generalization error of feature space machines, such as support vector machines and related regularization networks by obtaining new bounds on their covering numbers. The proofs are based on a viewpoint that is apparently novel in the field of statistical learning theory. The hypothesis class is described in terms of a linear operator mapping from a possibly infinit...
A Quadratic Margin-Based Model for Weighting Fuzzy Classification Rules Inspired by Support Vector Machines
Recently, tuning the weights of the rules in Fuzzy Rule-Based Classification Systems has been researched in order to improve the accuracy of classification. In this paper, a margin-based optimization model, inspired by Support Vector Machine classifiers, is proposed to compute these fuzzy rule weights. This approach not only considers both accuracy and generalization criteria in a single objective fu...
Journal:
IEEE Trans. Information Theory
Volume 47, Issue -
Pages -
Publication date: 2001